Purpose

This tool mentor describes how to report on a schedule run in performance testing using Rational LoadTest. Reporting helps you analyze the results of your performance tests.

Overview

If your schedule has completed successfully, LoadTest automatically runs Status and Performance reports against the data in the log and displays the report output. After you examine the output from these reports, you can either save or delete it.

LoadTest provides various types of reports designed to analyze the results of a schedule run. You can also define new reports. The following reports are available automatically:

Performance - Displays the response times in the schedule run and calculates the mean, standard deviation, and percentiles of the response times for each command ID. The output groups responses by command ID and shows only those that passed. (Response reports, by contrast, show both passed and failed responses.)

Compare - Compares the response times measured by Performance reports. After you have generated output from several Performance reports, use a Compare report to compare a specific field across them.

Response - Displays individual response times and whether each response passed or failed. This report is useful for looking at data points for individual responses as well as trends in the data. The output shows each command ID individually, together with the status of its response. This report can also display resource data plotted against response data points.

Status - Provides a quick summary of which commands passed and which failed. The output shows the status of all VU emulation commands and SQABasic timer commands. If you have failures, run an Analog report to examine them.

Analog - Examines errors in your run. The output shows the communication between the virtual user and the system under test. If you access a database, the output also shows the database errors. If you need further detail, run a Trace report.

Trace - Examines any failures in detail. The output formats raw data from the logs without performing statistical analysis. It includes the timestamp of each VU emulation command and SQABasic timer command and the counts of data sent and received.

Usage - Displays cumulative response time and summary statistics, as well as throughput information, for VU emulation commands only.

You can run a variety of reports on a schedule run.

  1. Run a Status report.
  2. Run a Performance report.
  3. Run other reports from the Report bar.

1.   Run a Status report.

Status reports show how well the responses you actually received during playback correspond with the responses you expected to receive. If a response you received is the same as the expected response, LoadTest considers it to have passed; otherwise, LoadTest considers it to have failed.

The Status report output contains the following information:

Cmd ID - The command ID associated with the response.

NUM - The number of responses corresponding to each command ID. This number is the sum of the numbers in the Passed and Failed columns.

Passed - The number of passed responses for each command ID (that is, those that did not time out).

Failed - The number of failed responses for each command ID (that is, those that timed out because the expected response was not received).

% Passed - The percentage of responses that passed for that command ID.

% Failed - The percentage of responses that failed for that command ID.

This information is shown in columns; the last line of the report output lists the totals for each column.
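
The arithmetic behind these columns is simple: NUM is the sum of Passed and Failed, and the percentage columns divide each count by NUM. The short Python sketch below only illustrates that bookkeeping; the command IDs and data are hypothetical, and this is not LoadTest code.

    from collections import defaultdict

    # Hypothetical logged responses: (command ID, passed?) pairs.
    responses = [
        ("login", True), ("login", True), ("login", False),
        ("query", True), ("query", True),
    ]

    counts = defaultdict(lambda: {"passed": 0, "failed": 0})
    for cmd_id, passed in responses:
        counts[cmd_id]["passed" if passed else "failed"] += 1

    print(f"{'Cmd ID':10}{'NUM':>5}{'Passed':>8}{'Failed':>8}{'% Passed':>10}{'% Failed':>10}")
    total_passed = total_failed = 0
    for cmd_id, c in counts.items():
        num = c["passed"] + c["failed"]          # NUM = Passed + Failed
        print(f"{cmd_id:10}{num:>5}{c['passed']:>8}{c['failed']:>8}"
              f"{100.0 * c['passed'] / num:>10.1f}{100.0 * c['failed'] / num:>10.1f}")
        total_passed += c["passed"]
        total_failed += c["failed"]

    # The last line lists the totals for each column.
    total = total_passed + total_failed
    print(f"{'TOTAL':10}{total:>5}{total_passed:>8}{total_failed:>8}"
          f"{100.0 * total_passed / total:>10.1f}{100.0 * total_failed / total:>10.1f}")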

The procedure for running a Status report is described in Chapter 9 of the Using Rational LoadTest manual, available on the documentation CD.

2.   Run a Performance report.

Performance reports display the response times observed during the schedule run for the selected commands, along with the mean, standard deviation, and percentiles of those response times.

Performance reports use the same input data as Response reports, and provide similar data sorting and filtering. However, Performance reports group responses with the same command ID, while Response reports show each command ID individually.

The Performance report output contains the following information, shown in columns:

Cmd ID - The command ID associated with the response.

NUM - The number of responses for each command ID.

MEAN - The arithmetic mean of the response times of all responses for each command ID.

STD DEV - The standard deviation of the response times of all responses for each command ID.

MIN - The minimum response time of all the responses for each command ID.

50th, 70th, 80th, 90th, 95th - The percentiles of the response times of all the responses for each command ID. For example, if the 95th percentile for a command is 0.53, then 95% of its responses took less than 0.53 seconds.

MAX - The maximum response time of all the responses for each command ID.
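
The statistics in these columns are standard descriptive measures taken over the response times that share a command ID. The Python sketch below illustrates the arithmetic with hypothetical response times in seconds; it is not LoadTest code, and LoadTest's exact percentile method may differ from the simple nearest-rank approach used here.

    import statistics

    # Hypothetical response times (seconds) for one command ID.
    times = [0.21, 0.25, 0.30, 0.33, 0.35, 0.40, 0.44, 0.47, 0.51, 0.53]

    def percentile(data, pct):
        # Nearest-rank percentile: roughly pct% of responses fall at or below this value.
        ordered = sorted(data)
        index = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
        return ordered[index]

    print("NUM    :", len(times))
    print("MEAN   :", round(statistics.mean(times), 3))     # arithmetic mean
    print("STD DEV:", round(statistics.stdev(times), 3))    # sample standard deviation
    print("MIN    :", min(times))
    for pct in (50, 70, 80, 90, 95):
        print(f"{pct}th   :", percentile(times, pct))
    print("MAX    :", max(times))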

The procedure for running a Performance report is described in Chapter 9 of the Using Rational LoadTest manual, available on the documentation CD.

3.   Run other reports from the Report bar.

LoadTest has a Report bar with a button for each type of report that you can run on a schedule run. To display the Report bar, use the View > Report Bar menu command. The quickest way to run a report is to click its name on the bar. LoadTest displays the report output, which you can then save or delete.

You can also customize the Report bar by populating it with your own reports.

The procedures for using the Report bar; running a report from the menu; viewing the log files; printing a report; copying, saving, and renaming a report; deleting reports; and customizing reports are described in Chapter 9 of the Using Rational LoadTest manual, available on the documentation CD.

 

Copyright © 1987 - 2000 Rational Software Corporation
